Comments about the article in Nature: This AI learnt language by seeing the world through a baby’s eyes
The following is a discussion of this article, published in Nature Vol. 626, 15 February 2024, by Elizabeth Gibney.
To study the full text select this link:
https://www.nature.com/articles/d41586-024-00288-1
- The text in italics is copied from the article
- Each quotation is immediately followed by some comments
In the final paragraph I give my own opinion.
Reflection
Introduction
1. Baby’s-eye views
2. Lessons about learning
Real-world language learning is much richer and more varied than what the AI experienced.
The researchers say that, because the AI is limited to training on still images and written text, it could not experience interactions that are inherent to a real baby’s life.
The AI struggled to learn the word ‘hand’, for example, which is usually learnt early in an infant’s life, says Vong.
“Babies have their own hands, they have a lot of experience with them. That’s definitely a missing component of our model.”